Rayleigh quotient

In mathematics, for a given complex Hermitian matrix M and nonzero vector x, the Rayleigh quotient[1] R(M, x) is defined as[2][3]:

R(M,x) := {x^{*} M x \over x^{*} x}.

For real matrices and vectors, the condition of being Hermitian reduces to that of being symmetric, and the conjugate transpose x^{*} to the usual transpose x'. Note that R(M, c x) = R(M,x) for any real scalar c \neq 0. Recall that a Hermitian (or real symmetric) matrix has real eigenvalues. It can be shown that, for a given matrix, the Rayleigh quotient reaches its minimum value \lambda_\min (the smallest eigenvalue of M) when x is v_\min (the corresponding eigenvector). Similarly, R(M, x) \leq \lambda_\max and R(M, v_\max) = \lambda_\max. The Rayleigh quotient is used in the min-max theorem to get exact values of all eigenvalues. It is also used in eigenvalue algorithms to obtain an eigenvalue approximation from an eigenvector approximation; in particular, it is the basis for Rayleigh quotient iteration.

The range of the Rayleigh quotient is called the numerical range of the matrix.
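
As a numerical illustration, the following minimal Python/NumPy sketch (the function name rayleigh_quotient and the random test matrix are chosen purely for illustration) evaluates R(M, x) for a Hermitian matrix and checks the bounds and the value at an eigenvector described above:

    import numpy as np

    def rayleigh_quotient(M, x):
        """R(M, x) = x* M x / (x* x) for a square matrix M and a nonzero vector x."""
        return (x.conj() @ M @ x) / (x.conj() @ x)

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    M = (A + A.conj().T) / 2                    # make M Hermitian
    eigvals, eigvecs = np.linalg.eigh(M)        # real eigenvalues, sorted ascending

    x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    r = rayleigh_quotient(M, x).real            # real up to round-off for Hermitian M

    assert eigvals[0] - 1e-12 <= r <= eigvals[-1] + 1e-12                 # lambda_min <= R(M, x) <= lambda_max
    assert np.isclose(rayleigh_quotient(M, eigvecs[:, -1]), eigvals[-1])  # R(M, v_max) = lambda_max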


Special case of covariance matrices

A covariance matrix M can be represented as the product A' A. Its eigenvalues are real and non-negative:

M v_i = A' A v_i = \lambda _i v_i
v_i' A' A v_i = v_i' \lambda _i v_i
 \left\| A v_i \right\|^2 = \lambda _i \left\| v_i \right\|^2
 \lambda _i = \frac{\left\| A v_i \right\|^2}{\left\| v_i \right\|^2} \geq 0.

The eigenvectors are orthogonal to one another:

M v_i = \lambda _i v_i
v _j' M v_i = \lambda _i v_j' v_i
(M v_j )' v_i = \lambda _i v_j' v_i
\lambda _j v_j ' v_i = \lambda _i v_j' v_i
(\lambda _j - \lambda _i) v_j ' v_i = 0
v_j ' v_i = 0 whenever \lambda_j \neq \lambda_i (in the case of repeated eigenvalues, an orthogonal basis of the corresponding eigenspace can always be chosen).
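
These two facts can be verified numerically; the following Python/NumPy sketch, using an arbitrary real matrix A chosen for illustration, forms M = A'A and checks that its eigenvalues are non-negative and its eigenvectors mutually orthogonal:

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((6, 3))
    M = A.T @ A                                  # covariance-type matrix M = A' A

    eigvals, eigvecs = np.linalg.eigh(M)         # columns of eigvecs are eigenvectors v_i
    assert np.all(eigvals >= -1e-12)             # eigenvalues are non-negative
    assert np.allclose(eigvecs.T @ eigvecs, np.eye(3))   # eigenvectors are mutually orthogonal

    for lam, v in zip(eigvals, eigvecs.T):       # lambda_i = ||A v_i||^2 / ||v_i||^2
        assert np.isclose(lam, np.linalg.norm(A @ v) ** 2 / np.linalg.norm(v) ** 2)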

The Rayleigh quotient can be expressed as a function of the eigenvalues by decomposing any vector x on the basis of eigenvectors:

x = \sum _{i=1} ^n \alpha _i v_i, where \alpha_i = \frac{x'v_i}{\sqrt{v_i'v_i}} = \frac{\langle x,v_i\rangle}{\| v_i \|} is the coordinate of x orthogonally projected onto v_i (the eigenvectors v_i being taken of unit length, so that they form an orthonormal basis)
R(M,x) = \frac{x' A' A x}{x' x}
R(M,x) = \frac{(\sum _{j=1} ^n \alpha _j v_j)' A' A (\sum _{i=1} ^n \alpha _i v_i)}{(\sum _{j=1} ^n \alpha _j v_j)' (\sum _{i=1} ^n \alpha _i v_i)}

which, by orthogonality of the eigenvectors, becomes:

R(M,x) = \frac{\sum _{i=1} ^n \alpha _i ^2 \lambda _i}{\sum _{i=1} ^n \alpha _i ^2} = \sum_{i=1}^n \lambda_i \frac{(x'v_i)^2}{ (x'x)( v_i' v_i)}

In the last representation we can see that the Rayleigh quotient is the sum of the squared cosines of the angles formed by the vector x and each eigenvector v_i, weighted by the corresponding eigenvalues.

If a vector x maximizes R(M,x), then any nonzero scalar multiple k x (for k \ne 0) also maximizes it, so the problem can be reduced to the Lagrange problem of maximizing \sum _{i=1} ^n \alpha _i ^2 \lambda _i under the constraint that \sum _{i=1} ^n \alpha _i ^2 = 1.

Since all the eigenvalues are non-negative and the objective is linear in the \alpha_i^2, the maximum is attained at a vertex of the domain, namely when \alpha _1 = 1 and \alpha _i = 0 for all i > 1 (when the eigenvalues are ordered by decreasing magnitude).
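
Both the weighted-cosine representation and the location of the maximizer can be checked numerically; the sketch below (again with an illustrative random matrix) verifies that R(M, x) equals the eigenvalue-weighted sum of squared cosines and that the maximum value \lambda_\max is attained at v_\max:

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((5, 5))
    M = A.T @ A
    eigvals, eigvecs = np.linalg.eigh(M)

    x = rng.standard_normal(5)
    R = (x @ M @ x) / (x @ x)

    # R(M, x) = sum_i lambda_i * cos^2(angle between x and v_i)
    cos2 = (eigvecs.T @ x) ** 2 / ((x @ x) * np.sum(eigvecs ** 2, axis=0))
    assert np.isclose(R, np.sum(eigvals * cos2))

    # the maximum value lambda_max is attained at the corresponding eigenvector v_max
    v_max = eigvecs[:, -1]
    assert np.isclose((v_max @ M @ v_max) / (v_max @ v_max), eigvals[-1])
    assert R <= eigvals[-1] + 1e-12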

Alternatively, this result can be arrived at by the method of Lagrange multipliers. The problem is to find the critical points of the function

R(M,x) = x^T M x ,

subject to the constraint \|x\|^2 = x^Tx = 1, that is, to find the critical points of

\mathcal{L}(x) = x^T M x  -\lambda (x^Tx - 1),

where \lambda is a Lagrange multiplier. The stationary points of \mathcal{L}(x) occur at

\frac{d\mathcal{L}(x)}{dx} = 0
\therefore 2x^T M^T - 2\lambda x^T = 0
\therefore M x = \lambda x

and  R(M,x) = \frac{x^T M x}{x^T x} = \lambda \frac{x^Tx}{x^T x} = \lambda.

Therefore, the eigenvectors x_1 \ldots x_n of M are the critical points of the Rayleigh quotient and their corresponding eigenvalues \lambda_1 \ldots \lambda_n are the stationary values of R.
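
This stationarity can also be verified directly. For symmetric M, the quotient rule gives the gradient \nabla R(M,x) = 2(Mx - R(M,x)\,x)/(x^T x), which vanishes exactly when Mx = R(M,x)\,x. The sketch below (with an illustrative test matrix and helper name) evaluates this gradient at an eigenvector and at a generic vector:

    import numpy as np

    def grad_R(M, x):
        """Gradient of R(M, x) for symmetric M: 2 * (M x - R(M, x) x) / (x' x)."""
        R = (x @ M @ x) / (x @ x)
        return 2.0 * (M @ x - R * x) / (x @ x)

    rng = np.random.default_rng(3)
    B = rng.standard_normal((4, 4))
    M = (B + B.T) / 2                        # symmetric test matrix

    eigvals, eigvecs = np.linalg.eigh(M)
    v = eigvecs[:, 0]                        # an eigenvector: a critical point of R
    x = rng.standard_normal(4)               # a generic vector: in general not critical

    assert np.allclose(grad_R(M, v), 0.0)    # gradient vanishes at the eigenvector
    assert not np.allclose(grad_R(M, x), 0.0)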

This property is the basis for principal components analysis and canonical correlation.

Use in Sturm–Liouville theory

Sturm–Liouville theory concerns the action of the linear operator

L(y) = \frac{1}{w(x)}\left(-\frac{d}{dx}\left[p(x)\frac{dy}{dx}\right] + q(x)y\right)

on the inner product space defined by

\langle{y_1,y_2}\rangle = \int_a^b w(x)y_1(x)y_2(x) \, dx

of functions satisfying some specified boundary conditions at a and b. In this case the Rayleigh quotient is

\frac{\langle{y,Ly}\rangle}{\langle{y,y}\rangle} = \frac{\int_a^b{y(x)\left(-\frac{d}{dx}\left[p(x)\frac{dy}{dx}\right] + q(x)y(x)\right)}dx}{\int_a^b{w(x)y(x)^2}dx}.

This is sometimes presented in an equivalent form, obtained by separating the integral in the numerator and using integration by parts:

\frac{\langle{y,Ly}\rangle}{\langle{y,y}\rangle} = \frac{\int_a^b{y(x)\left(-\frac{d}{dx}\left[p(x)y'(x)\right]\right)}dx + \int_a^b{q(x)y(x)^2} \, dx}{\int_a^b{w(x)y(x)^2} \, dx}
= \frac{-y(x)\left[p(x)y'(x)\right]|_a^b + \int_a^b{y'(x)\left[p(x)y'(x)\right]} \, dx + \int_a^b{q(x)y(x)^2} \, dx}{\int_a^b{w(x)y(x)^2} \, dx}
= \frac{-p(x)y(x)y'(x)|_a^b + \int_a^b\left[p(x)y'(x)^2 + q(x)y(x)^2\right] \, dx}{\int_a^b{w(x)y(x)^2} \, dx}.
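
In practice the last form is used to bound eigenvalues from above: for any admissible trial function y, the Rayleigh quotient is at least the smallest eigenvalue of the Sturm–Liouville problem. The Python sketch below illustrates this with the simple example p = w = 1, q = 0 on [0, π] with y(0) = y(π) = 0 (exact lowest eigenvalue 1, eigenfunction sin x), using the trial function y = x(π − x) and a plain Riemann sum for the integrals:

    import numpy as np

    # Sturm-Liouville problem -y'' = lambda y on [0, pi] with y(0) = y(pi) = 0
    # (p = w = 1, q = 0); the exact lowest eigenvalue is 1, with eigenfunction sin(x).
    x = np.linspace(0.0, np.pi, 20001)
    dx = x[1] - x[0]
    y = x * (np.pi - x)                  # trial function satisfying the boundary conditions
    dy = np.pi - 2.0 * x                 # its derivative

    # the boundary term -p y y' vanishes because y(0) = y(pi) = 0
    numerator = np.sum(dy ** 2) * dx     # approximates the integral of p (y')^2 + q y^2
    denominator = np.sum(y ** 2) * dx    # approximates the integral of w y^2
    print(numerator / denominator)       # about 10 / pi^2 = 1.0132..., an upper bound on 1

The computed value, 10/π² ≈ 1.013, is both an upper bound on and a good estimate of the exact lowest eigenvalue 1.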

Generalization

For a given pair (A, B) of real symmetric positive-definite matrices, and a given non-zero vector x, the generalized Rayleigh quotient is defined as:

R(A,B; x) := \frac{x^T A x}{x^T B x}.

The generalized Rayleigh quotient can be reduced to the Rayleigh quotient R(D, Cx) through the transformation D = {C^*}^{-1} A C^{-1}, where C^* C = B is the Cholesky decomposition of the matrix B.
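
A short numerical check of this reduction is sketched below (numpy.linalg.cholesky returns a lower-triangular factor L with B = L L^T, so C = L^T gives the factorization C^*C = B used above; the matrices are random and purely illustrative):

    import numpy as np

    rng = np.random.default_rng(4)
    n = 4
    A0 = rng.standard_normal((n, n)); A = (A0 + A0.T) / 2            # symmetric A
    B0 = rng.standard_normal((n, n)); B = B0.T @ B0 + n * np.eye(n)  # symmetric positive-definite B
    x = rng.standard_normal(n)

    generalized = (x @ A @ x) / (x @ B @ x)      # R(A, B; x)

    L = np.linalg.cholesky(B)                    # lower-triangular L with B = L L^T
    C = L.T                                      # so that C^T C = B
    Cinv = np.linalg.inv(C)
    D = Cinv.T @ A @ Cinv                        # D = (C^T)^{-1} A C^{-1}
    y = C @ x
    ordinary = (y @ D @ y) / (y @ y)             # R(D, C x)

    assert np.isclose(generalized, ordinary)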

References

  1. ^ Also known as the Rayleigh–Ritz ratio; named after Walther Ritz and Lord Rayleigh.
  2. ^ Horn, R. A. and Johnson, C. R. (1985). Matrix Analysis. Cambridge University Press. pp. 176–180.
  3. ^ Parlett, B. N. (1998). The Symmetric Eigenvalue Problem. SIAM, Classics in Applied Mathematics.
